Training Multilayer Perceptrons with the Extended Kalman Algorithm

Authors

  • Sharad Singhal
  • Lance Wu
Abstract

A large fraction of recent work in artificial neural nets uses multilayer perceptrons trained with the back-propagation algorithm described by Rumelhart et al. This algorithm converges slowly for large or complex problems such as speech recognition, where thousands of iterations may be needed for convergence even with small data sets. In this paper, we show that training multilayer perceptrons is an identification problem for a nonlinear dynamic system which can be solved using the Extended Kalman Algorithm. Although computationally complex, the Kalman algorithm usually converges in a few iterations. We describe the algorithm and compare it with back-propagation using two-dimensional examples.


Similar articles

Recurrent Multilayer Perceptrons for Identification and Control: the Road to Applications

This study investigates the properties of artificial recurrent neural networks. Particular attention is paid to the question of how these nets can be applied to the identification and control of non-linear dynamic processes. Since these kinds of processes can only insufficiently be modelled by conventional methods, different approaches are required. Neural networks are considered to be useful for thi...

Full text

Regularisation in Sequential Learning Algorithms

In this paper, we discuss regularisation in online/sequential learning algorithms. In environments where data arrives sequentially, techniques such as cross-validation to achieve regularisation or model selection are not possible. Further, bootstrapping to determine a confidence level is not practical. To surmount these problems, a minimum variance estimation approach that makes use of the exte...

Full text

Direct Method for Training Feed-Forward Neural Networks Using Batch Extended Kalman Filter for Multi-Step-Ahead Predictions

This paper is dedicated to the long-term, or multi-step-ahead, time series prediction problem. We propose a novel method for training feed-forward neural networks, such as multilayer perceptrons, with tapped delay lines. Special batch calculation of derivatives called Forecasted Propagation Through Time and batch modification of the Extended Kalman Filter are introduced. Experiments were carrie...

Full text

Dual Extended Kalman Filter Algorithm for Training RBF Networks

where φ is a nonlinear function selected from a set of typical ones, ||·|| denotes the Euclidean norm, w_i are the tap weights, and C_i ∈ R are called RBF centers. It is easy to see that the formula above is equivalent to a special form of a 2-layer perceptron, which is linear in the parameters by fixing all the centers and nonlinearities in the hidden layer. The output layer simply performs a li...
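The snippet's key observation is that, with the centers fixed, the RBF network output is linear in the tap weights, so the output layer can be fit in closed form. A minimal sketch of that property, with Gaussian φ and arbitrarily chosen centers, widths, and target function as illustrative assumptions:

```python
import numpy as np

def rbf_design(X, centers, width=1.0):
    """Hidden-layer features phi(||x - c_i||) for Gaussian RBFs."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-(d / width) ** 2)

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(50, 2))          # 2-D inputs
y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])  # arbitrary smooth target
centers = rng.uniform(-1, 1, size=(9, 2))      # fixed RBF centers

# With centers fixed, the model is linear in the tap weights w,
# so least squares gives the optimal output layer directly.
Phi = rbf_design(X, centers)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = Phi @ w
```

This linearity is exactly what makes Kalman-type (rather than extended Kalman) updates attractive for the output layer of an RBF network: with a linear measurement model, no Jacobian linearization is needed.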

Full text

Comparing Hybrid Systems to Design and Optimize Artificial Neural Networks

In this paper, we conduct a comparative study between hybrid methods to optimize multilayer perceptrons: a model that optimizes the architecture and initial weights of multilayer perceptrons; a parallel approach to optimize the architecture and initial weights of multilayer perceptrons; a method that searches for the parameters of the training algorithm; and an approach for cooperative co-evolut...

Full text


Journal:

Volume   Issue

Pages  -

Publication date: 1988